Syntax-aware Semantic Role Labeling without Parsing
Authors
Abstract
Similar Articles
Syntax Aware LSTM model for Semantic Role Labeling
In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntactic information, but it is not well handled by existing models. In this paper, we propose the Syntax Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relation in an arc...
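A minimal sketch of the general idea, assuming a child-sum tree-LSTM-style cell (the exact gating of SA-LSTM may differ): the recurrence is composed along dependency arcs, so a head word's state is built from the states of its dependents rather than from its linear neighbor.

import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    # A head word's state is computed from its own embedding plus the
    # summed states of its dependents, so the recurrence follows the
    # dependency tree rather than the linear word order.
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,) embedding of the head word
        # child_h, child_c: (num_children, hidden_dim) states of its dependents
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_sum), 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))  # one forget gate per arc
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c

Running such a cell over the words in bottom-up topological order of the dependency tree yields one hidden state per word that summarizes its whole subtree, which can then feed a role classifier.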
Syntax Aware LSTM Model for Chinese Semantic Role Labeling
For the semantic role labeling (SRL) task, both traditional methods and recent recurrent neural network (RNN) based methods rely on feature engineering to utilize parsing information. In this paper, we propose the Syntax Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM adapts according to dependency parsing information in order to model parsing information direct...
Semantic Role Labeling using Dependency Syntax
This document gives a brief introduction to the topic of Semantic Role Labeling using Dependency Syntax. We also describe a system that has been developed and tested on a corpus from the CoNLL-2008 shared task. We evaluate the system and give a short discussion on further improvements. Our results are reasonably good compared to those reached during the shared task.
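A minimal sketch of the kind of dependency-based features such a classifier might use, assuming a simple token representation with 'form', 'pos', 'head', and 'deprel' fields; the feature set is illustrative, not the actual CoNLL-2008 system's features.

def dependency_srl_features(sent, pred_idx, arg_idx):
    # Illustrative features for classifying the semantic role of the word at
    # arg_idx with respect to the predicate at pred_idx. `sent` is a list of
    # dicts with 'form', 'pos', 'head' (index of the syntactic head, or None
    # for the root) and 'deprel'.
    def path_to_root(i):
        path = []
        while i is not None:
            path.append(i)
            i = sent[i]['head']
        return path

    pred_path, arg_path = path_to_root(pred_idx), path_to_root(arg_idx)
    pred_set = set(pred_path)
    common = next(i for i in arg_path if i in pred_set)  # lowest common ancestor
    rel_path = ([sent[i]['deprel'] + '^' for i in arg_path[:arg_path.index(common)]] +
                [sent[i]['deprel'] + 'v'
                 for i in reversed(pred_path[:pred_path.index(common)])])
    return {
        'pred_form': sent[pred_idx]['form'].lower(),
        'arg_pos': sent[arg_idx]['pos'],
        'arg_deprel': sent[arg_idx]['deprel'],
        'rel_path': '|'.join(rel_path),  # dependency path predicate <-> argument
        'position': 'before' if arg_idx < pred_idx else 'after',
    }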
Semantic Role Labeling Improves Incremental Parsing
Incremental parsing is the task of assigning a syntactic structure to an input sentence as it unfolds word by word. Incremental parsing is more difficult than full-sentence parsing, as incomplete input increases ambiguity. Intuitively, an incremental parser that has access to semantic information should be able to reduce ambiguity by ruling out semantically implausible analyses, even for incompl...
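A minimal sketch of the kind of integration the abstract suggests, assuming a generic beam-search incremental parser; extend, syn_score, sem_score, and the weight alpha are illustrative placeholders, not the paper's actual model.

def incremental_parse(words, init_state, extend, syn_score, sem_score,
                      beam_size=8, alpha=0.5):
    # Illustrative beam search over partial analyses: each candidate is
    # ranked by its syntactic score plus a weighted semantic-plausibility
    # score (e.g. from an SRL model), so semantically implausible analyses
    # can be pruned before the sentence is complete.
    beam = [init_state]
    for word in words:
        candidates = [nxt for state in beam for nxt in extend(state, word)]
        candidates.sort(key=lambda s: syn_score(s) + alpha * sem_score(s),
                        reverse=True)
        beam = candidates[:beam_size]  # keep only the top analyses
    return beam[0] if beam else None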
Chinese Semantic Role Labeling with Shallow Parsing
Most existing systems for Chinese Semantic Role Labeling (SRL) make use of full syntactic parses. In this paper, we evaluate SRL methods that take partial parses as input. We first extend the study on Chinese shallow parsing presented in (Chen et al., 2006) by introducing a set of additional features. On the basis of our shallow parser, we implement SRL systems which cast SRL as the classification...
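A minimal sketch of casting SRL as classification over shallow-parse chunks, assuming a hypothetical feature set and a scikit-learn pipeline; the features and model here are illustrative, not the paper's actual system.

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def chunk_features(chunks, pred_idx, cand_idx):
    # Illustrative features for one (predicate, candidate-chunk) pair from a
    # shallow parse; `chunks` is a list of dicts with 'type' (NP, VP, ...) and
    # 'head' (the chunk's head word). Labels are role tags or 'NONE'.
    pred, cand = chunks[pred_idx], chunks[cand_idx]
    lo, hi = sorted((pred_idx, cand_idx))
    return {
        'pred_head': pred['head'],
        'cand_type': cand['type'],
        'cand_head': cand['head'],
        'position': 'before' if cand_idx < pred_idx else 'after',
        'distance': str(abs(cand_idx - pred_idx)),
        'chunk_path': '-'.join(c['type'] for c in chunks[lo:hi + 1]),
    }

# train_pairs: [(chunks, pred_idx, cand_idx), ...]; train_labels: role tag or 'NONE'
model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
# model.fit([chunk_features(*pair) for pair in train_pairs], train_labels)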
Journal
Journal title: Transactions of the Association for Computational Linguistics
Year: 2019
ISSN: 2307-387X
DOI: 10.1162/tacl_a_00272